Asymptotic Law of Likelihood Ratio for Multilayer Perceptron Models

Author

  • Joseph Rynkiewicz
Abstract

We consider regression models involving multilayer perceptrons (MLP) with one hidden layer and Gaussian noise. The data are assumed to be generated by a true MLP model, and the parameters of the MLP are estimated by maximizing the likelihood of the model. When the number of hidden units of the true model is known, the asymptotic distribution of the likelihood ratio (LR) statistic is easy to compute and converges to a χ2 law. However, if the number of hidden units is over-estimated, the Fisher information matrix of the model is singular, and the asymptotic behavior of the LR statistic is unknown, or can be divergent if the set of possible parameters is too large. This paper deals with this case and gives the exact asymptotic law of the LR statistic. Namely, if the parameters of the MLP lie in a suitable compact set, we show that the LR statistic converges to the maximum of the square of a Gaussian process indexed by a class of limit score functions.
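The regular case mentioned above (true number of hidden units known, Fisher information non-singular) is the classical Wilks setting: twice the log likelihood ratio between nested Gaussian regression models converges to a χ2 law with degrees of freedom equal to the number of extra parameters. The following is a minimal Monte Carlo sketch of that regular case, using a linear regression analogue rather than an actual MLP (fitting MLPs is beside the point here); all function names and the simulation sizes are illustrative choices, not anything from the paper.

```python
import math
import random

def lr_stat(n, rng):
    """One LR statistic for a nested Gaussian regression test.

    Data are generated under the null: y = eps, eps ~ N(0, 1),
    with x an irrelevant regressor. The alternative adds one
    parameter (the slope on x), so under the null the statistic
    n * log(RSS0 / RSS1) should be asymptotically chi-square(1).
    """
    x = [rng.gauss(0.0, 1.0) for _ in range(n)]
    y = [rng.gauss(0.0, 1.0) for _ in range(n)]

    # Null model: intercept only.
    ybar = sum(y) / n
    rss0 = sum((yi - ybar) ** 2 for yi in y)

    # Alternative model: y = a + b * x, ordinary least squares in closed form.
    xbar = sum(x) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = ybar - b * xbar
    rss1 = sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y))

    # Gaussian LR statistic with estimated noise variance.
    return n * math.log(rss0 / rss1)

rng = random.Random(0)
stats = [lr_stat(200, rng) for _ in range(2000)]
mean_lr = sum(stats) / len(stats)
print(round(mean_lr, 2))  # should be close to 1, the mean of chi-square(1)
```

The singular case studied in the paper is precisely where this picture breaks: with redundant hidden units the extra parameters are not identifiable, no fixed-degrees-of-freedom χ2 limit exists, and the limit is instead the supremum of a squared Gaussian process over the class of limit score functions.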


Similar Articles

Asymptotic Theory of Locally Conic Models and its Application to Multilayer Neural Networks

This paper discusses the maximum likelihood estimation in a statistical model with unidentifiability, using the framework of conic singularity. The likelihood ratio may diverge in unidentifiable cases, though in regular cases it converges to a χ2 distribution. A useful sufficient condition of such divergence is obtained, and is applied to neural networks. The exact order for multilayer perceptrons...


Statistical Analysis of Unidentifiable Models and its Application to Multilayer Neural Networks

This paper discusses the maximum likelihood estimator of the parametric model that has lack of identifiability in low dimensional subsets in the parameter space. Among many statistical models with unidentifiability, neural network models are the main concern of this paper. The unidentifiable true parameter is formulated as a conic singularity of the model embedded in an infinite dimensional space o...


Asymptotic properties of mixture-of-experts models

Abstract. The statistical properties of the likelihood ratio test statistic (LRTS) for mixture-of-experts models are addressed in this paper. This question is essential when estimating the number of experts in the model. Our purpose is to extend the existing results for mixtures (Liu and Shao, 2003) and mixtures of multilayer perceptrons (Olteanu and Rynkiewicz, 2008). In this paper we study a s...


Efficient estimation of multidimensional regression model using multilayer perceptrons

This work concerns the estimation of multidimensional nonlinear regression models using multilayer perceptrons (MLPs). The main problem with such models is that we need to know the covariance matrix of the noise to get an optimal estimator. However, we show in this paper that if we choose as the cost function the logarithm of the determinant of the empirical error covariance matrix, then we get...


Parametric Bootstrap for Test of Contrast Difference in Neural Networks

This work concerns the contrast difference test and its asymptotic properties for nonlinear autoregressive models. Our approach is based on an application of the parametric bootstrap method. It is a re-sampling method based on the estimated parameters of the models. The resulting methodology is illustrated by simulations of multilayer perceptron models, and an asymptotic justification is give...



Publication date: 2008